# Korean-English Bilingual Generation
## GECKO 7B (kifai)

**License:** Apache-2.0

GECKO is a 7-billion-parameter decoder-only Transformer model trained on Korean, English, and code.

**Tags:** Large Language Model, Transformers, Supports Multiple Languages
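Since the card lists Transformers support, loading and prompting the model could look like the sketch below. The Hub id `kifai/GECKO-7B` and the single-GPU float16 setup are assumptions, not details from the card; verify the repo id on the Hub before use.

```python
# Minimal sketch: generation with a decoder-only Korean-English model
# via Hugging Face Transformers. Repo id is assumed from the card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kifai/GECKO-7B"  # assumed repo id; confirm on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so a 7B model fits on one GPU
    device_map="auto",
)

prompt = "한국의 수도는"  # "The capital of Korea is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```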
## OPEN SOLAR KO 10.7B GGUF (MaziyarPanahi)

**License:** Apache-2.0

A GGUF-format quantized release of the beomi/OPEN-SOLAR-KO-10.7B model, offered at quantization levels from 2-bit to 8-bit, for Korean and English text generation.

**Tags:** Large Language Model, Supports Multiple Languages
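GGUF files are consumed by llama.cpp-based runtimes rather than by Transformers directly; a minimal sketch with llama-cpp-python might look as follows. Both the repo id and the quant filename pattern are assumptions to check against the repository's file list.

```python
# Minimal sketch: running a GGUF quantization of OPEN-SOLAR-KO-10.7B
# with llama-cpp-python. Repo id and quant filename are assumptions.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="MaziyarPanahi/OPEN-SOLAR-KO-10.7B-GGUF",  # assumed repo id
    filename="*Q4_K_M.gguf",  # assumed 4-bit quant; glob matched by the library
    n_ctx=4096,               # context window
)

out = llm("Q: 한국의 수도는 어디입니까?\nA:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```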
## Ko En Llama2 13b (hyunseoki)

A Korean-English bilingual autoregressive language model based on the LLaMA 2 13B architecture, trained with an emphasis on Korean corpora while retaining English capability.

**Tags:** Large Language Model, Transformers, Korean
## 42dot LLM PLM 1.3B (42dot)

42dot LLM-PLM is a 1.3-billion-parameter pre-trained language model developed by 42dot that supports Korean and English text generation.

**Tags:** Large Language Model, Transformers, Supports Multiple Languages
## Kollama 33b (beomi)

**License:** MIT

KoLLaMA is a large language model based on the LLaMA architecture, trained on Korean, English, and code datasets and aimed at open-source Korean LLM research.

**Tags:** Large Language Model, Transformers, Supports Multiple Languages
## Ke T5 Base Ko (KETI-AIR)

**License:** Apache-2.0

KE-T5 is a Korean-English bilingual text generation model based on the T5 architecture, developed by the Korea Electronics Technology Institute (KETI). It supports cross-lingual knowledge transfer for tasks such as dialogue generation.

**Tags:** Large Language Model, Korean
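As an encoder-decoder model, KE-T5 is loaded through the seq2seq API rather than the causal-LM one. A minimal sketch, assuming the Hub id `KETI-AIR/ke-t5-base-ko` and using T5's sentinel-token span infilling on the raw pre-trained checkpoint (which would normally be fine-tuned before downstream use):

```python
# Minimal sketch: loading KE-T5 as an encoder-decoder model.
# Repo id is assumed from the card above; confirm on the Hub.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "KETI-AIR/ke-t5-base-ko"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5-style span infilling: the model predicts text for the sentinel token.
text = "한국전자기술연구원은 <extra_id_0>를 개발했다."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```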